Hopfield learning rule with high capacity storage of time-correlated patterns
Authors
Abstract
Similar resources
Energy Relaxation for Hopfield Network with the New Learning Rule
In this paper, the time for energy relaxation for the Little-Hopfield neural network using the new learning rule is shown to be better than the relaxation time using Hebbian learning. We argue that this should be so given the characteristics of the activation function, and show through computer simulations that this is indeed the case. In this paper, it has been proven that the new learning rule has a higher ...
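The relaxation-time claim rests on a standard property of Little-Hopfield networks: for symmetric weights with zero diagonal, the energy E = -(1/2) sᵀWs is a Lyapunov function, so asynchronous updates can only lower (or leave unchanged) the energy until the state settles in an attractor. The following is a minimal sketch of that relaxation process, not the paper's own code; the weight matrix, update order, and sweep limit are illustrative assumptions.

```python
import numpy as np

def energy(W, s):
    # Hopfield energy E = -1/2 s^T W s (zero thresholds assumed)
    return -0.5 * s @ W @ s

def relax(W, s, rng, max_sweeps=50):
    """Asynchronous sign updates until no unit changes in a full sweep.

    Assumes W is symmetric with zero diagonal, so each accepted flip
    cannot increase the energy. Returns the settled state and the
    number of sweeps taken.
    """
    n = len(s)
    for sweep in range(max_sweeps):
        changed = False
        for i in rng.permutation(n):     # random update order, one unit at a time
            new_si = 1 if W[i] @ s >= 0 else -1
            if new_si != s[i]:
                s[i] = new_si
                changed = True
        if not changed:
            return s, sweep
    return s, max_sweeps
```

Counting sweeps until convergence, as above, is one simple way to compare relaxation times under different learning rules, since the stored patterns only enter through W.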
On the Maximum Storage Capacity of the Hopfield Model
Recurrent neural networks (RNN) have traditionally been of great interest for their capacity to store memories. In past years, several works have been devoted to determining the maximum storage capacity of RNN, especially for the case of the Hopfield network, the most popular kind of RNN. Analyzing the thermodynamic limit of the statistical properties of the Hamiltonian corresponding to the Hopfi...
Storage Capacity of Letter Recognition in Hopfield Networks
Associative memory is a dynamical system which has a number of stable states with a domain of attraction around them [1]. If the system starts at any state in the domain, it will converge to the locally stable state, which is called an attractor. In 1982, Hopfield [2] proposed a fully connected neural network model of associative memory in which patterns can be stored by being distributed among neuro...
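The attractor dynamics described above can be demonstrated in a few lines: patterns are written into the weights with the classical Hebbian outer-product rule, and a corrupted probe then converges back to the nearest stored pattern. This is a generic sketch of Hopfield's 1982 model, not code from any of the papers listed here; the pattern size and noise level are arbitrary choices.

```python
import numpy as np

def train_hebbian(patterns):
    """Store bipolar (+1/-1) patterns: W = (1/n) * sum of outer products."""
    n = patterns.shape[1]
    W = patterns.T @ patterns / n
    np.fill_diagonal(W, 0.0)          # no self-connections
    return W

def recall(W, state, steps=10):
    """Synchronous sign updates until a fixed point (attractor) is reached."""
    for _ in range(steps):
        new = np.where(W @ state >= 0, 1, -1)
        if np.array_equal(new, state):
            break
        state = new
    return state
```

Starting from a probe inside a stored pattern's basin of attraction, `recall` returns the clean pattern, which is exactly the error-correcting behaviour that makes the model an associative memory.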
Storage of correlated patterns in a perceptron
We calculate the storage capacity of a perceptron for correlated Gaussian patterns. We find that the storage capacity αc can be less than 2 if similar patterns are mapped onto different outputs and vice versa. As long as the patterns are in a general position we obtain, in contrast to previous works, that αc ≥ 1 in agreement with Cover’s theorem. Numerical simulations confirm the results. The c...
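The αc = 2 benchmark and the αc ≥ 1 bound invoked above come from Cover's 1965 counting argument: of the 2^p dichotomies of p points in general position in R^n, exactly C(p, n) = 2 Σ_{k=0}^{n-1} C(p-1, k) are linearly separable. A small sketch of that counting function (an illustration of the theorem, not the paper's calculation for correlated patterns):

```python
from math import comb

def cover_fraction(p, n):
    """Fraction of the 2^p dichotomies of p points in general position
    in R^n that a perceptron can realise (Cover, 1965)."""
    if p <= n:
        return 1.0                                  # below capacity: all separable
    # C(p, n) / 2^p = sum_{k<n} C(p-1, k) / 2^(p-1)
    return sum(comb(p - 1, k) for k in range(n)) / 2 ** (p - 1)
```

At p = 2n the fraction is exactly 1/2, which is why αc = 2 is the classical capacity for uncorrelated patterns, and for p ≤ n every dichotomy is separable, consistent with the αc ≥ 1 statement above.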
Autoassociative Memory with high Storage Capacity
The general neural unit (GNU) [1] is known for its high storage capacity as an autoassociative memory. The exponential increase in its storage capacity with the number of inputs per neuron is far greater than the linear growth in the famous Hopfield network [2]. This paper shows that the GNU attains an even higher capacity with the use of pyramids of neurons instead of single neurons as its nodes....
Journal
Journal title: Electronics Letters
Year: 1997
ISSN: 0013-5194
DOI: 10.1049/el:19971233